Patent abstract:
The invention relates to a mobile terminal (100) and a control method thereof. The present invention includes a camera (121), a display unit (151) configured to display a preview image input through the camera (121), and a controller (180) configured to select an object in the preview image in response to a user input touching the object or defining a specific region including the object, and to produce an alarm when the object exits the preview image. Publication number: FR3021132A1 Application number: FR1554204 Filing date: 2015-05-11 Publication date: 2015-11-20 Inventor: Younghoon Jeung Applicant: LG Electronics Inc; Main IPC class:
Patent description:
[0001] The present invention relates to a mobile terminal and, more particularly, to a mobile terminal and a control method thereof. Although the present invention is suitable for a wide range of applications, it is particularly suitable for producing an alarm when a subject deviates from a preview image. Terminals can generally be classified into mobile/portable terminals or fixed terminals based on their mobility. Mobile terminals can also be classified into handheld terminals or vehicle-mounted terminals according to whether or not they can be carried directly by a user. [0002] Mobile terminals have become more and more functional. Examples of these functions include voice and data communications, image and video capture by means of a camera, audio recording, music file playback through a speaker system, and the display of images and video on a display unit. Some mobile terminals include additional functionality that supports games, while other terminals are configured as media players. More recently, mobile terminals have been configured to receive broadcast and multicast signals that allow viewing of content such as videos and television programs. [0003] Efforts are being made to support and increase the functionality of mobile terminals. These efforts include software and hardware enhancements as well as changes and improvements to the structural components. In particular, as mentioned in the preceding description, a user can photograph an image (for example, a photo, a video, etc.) that he wishes to record by means of a mobile terminal. If a camera of the mobile terminal is activated for photography, the mobile terminal can output a preview image of a target, which the user wants to photograph, through a display unit. The user is then able to check, in advance, a photo or video about to be taken, thanks to the preview image. [0004] Consider a case where a user faces the camera. If a front camera, exposed in the same direction as the display unit, is activated, the user can easily check whether a subject of interest deviates from the preview image while looking at the display unit. However, if a rear camera, exposed in the direction opposite to that of the display unit, is activated, it is difficult for the user to check whether the subject of interest deviates from the preview image, since he is unable to view the display unit. Thus, even though a photo or video is taken, a subject in which the user is interested may not be captured correctly, which causes a problem. [0005] In addition, while a photo or video is being taken, if the subject is obscured by a passing person or object, the user may be unable to interrupt the shot in time, even when checking the preview screen or the current shooting screen. Thus, the subject may not be captured correctly, and after shooting, the user must perform a correction process to remove the obstruction. Accordingly, embodiments of the present invention relate to a mobile terminal and a control method thereof that substantially eliminate one or more problems due to the limitations and disadvantages of the related art. [0006] An object of the present invention is to provide a mobile terminal and a control method thereof that enhance user convenience. In particular, an object of the present invention is to provide a mobile terminal and a control method thereof whereby an alarm can be generated when a subject deviates from a preview image.
[0007] Other advantages, objects and features of the invention will be set forth in the disclosure herein and in the accompanying drawings. Such aspects may also be appreciated by those skilled in the art on the basis of the disclosure herein. To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, a mobile terminal according to the present invention can include a camera, a display unit configured to display a preview image input through the camera, and a controller configured to select an object in the preview image in response to a user input touching the object or defining a specific region including the object, and to produce an alarm when the object exits the preview image. In another aspect of the present invention, as embodied and broadly described herein, a mobile terminal according to the present invention may include a camera, a display unit configured to display a preview image input through the camera, and a controller configured to extract identification information of an object from a previously taken picture and, if the extracted identification information of the object matches the preview image input through the camera, to command the camera to take a new picture automatically. In a further aspect of the present invention, as embodied and broadly described herein, a method of controlling a mobile terminal according to the present invention may include displaying a preview image input through a camera, selecting an object in response to a user input touching the object or defining a specific region including the object, and generating an alarm when the object exits the preview image. The effects obtainable from the present invention are not limited to the aforementioned effect. Other effects not mentioned will also be clearly understood from the following description by one skilled in the art to which the present invention pertains. It should be understood that the foregoing general description as well as the detailed description hereinafter are provided by way of example and explanation and are intended to provide a further explanation of the invention as claimed.
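The second aspect above (automatically taking a new picture when identification information extracted from a previous photo matches the preview image) can be illustrated with a short sketch. This is not the patent's implementation: the feature-matching approach (ORB keypoints via OpenCV) and the helpers `preview_frames` and `capture_photo` are assumptions chosen purely for illustration.

```python
# Illustrative sketch only. The patent does not specify how identification
# information is extracted or matched; ORB keypoint matching (OpenCV) is an
# assumed stand-in, and capture_photo() is a hypothetical camera call.
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
MIN_GOOD_MATCHES = 30   # assumed threshold for "the object matches"
MAX_DISTANCE = 50       # assumed descriptor-distance cutoff

def extract_identification_info(photo_bgr):
    """Extract keypoint descriptors identifying an object in a prior photo."""
    gray = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2GRAY)
    _, descriptors = orb.detectAndCompute(gray, None)
    return descriptors

def matches_preview(prior_desc, preview_bgr):
    """Return True if the previously photographed object appears in the preview."""
    gray = cv2.cvtColor(preview_bgr, cv2.COLOR_BGR2GRAY)
    _, preview_desc = orb.detectAndCompute(gray, None)
    if prior_desc is None or preview_desc is None:
        return False
    good = [m for m in matcher.match(prior_desc, preview_desc)
            if m.distance < MAX_DISTANCE]
    return len(good) >= MIN_GOOD_MATCHES

# Controller behaviour (hypothetical helpers preview_frames, capture_photo):
# for frame in preview_frames():
#     if matches_preview(prior_desc, frame):
#         capture_photo()   # take the new picture automatically
#         break
```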
The present invention will be better understood from the following detailed description and accompanying drawings, which are provided by way of illustration only, and are therefore not limitative of the present invention, and in which: Fig. 1 is a block diagram of a mobile terminal according to the present disclosure; Figs. 2 and 3 are conceptual representations of an example of the mobile terminal, seen from different directions; Fig. 4 is a flowchart of an operation of a mobile terminal according to an embodiment of the present invention; Figs. 5A to 5C are diagrams of an example for describing an operation of a mobile terminal in the event that a selected subject or an object included in a selected region deviates from a preview image; Figs. 6A and 6B are diagrams for describing an embodiment of interrupting a video recording and an embodiment of resuming the interrupted video recording; Figs. 7A and 7B are diagrams for describing the operation of a mobile terminal in case of appearance of a new object; Figs. 8A and 8B are diagrams for describing an embodiment of interrupting a video recording and an embodiment of resuming the interrupted video recording; Fig. 9 is a diagram for describing an example of changing an angle of a camera lens by following a subject; Fig. 10 is a diagram for describing an example of tracking a subject by zooming in or out; Fig. 11 is a diagram for describing an example of a shooting process using a first camera and a second camera according to the present invention; Figs. 12A-12C are diagrams for describing another example of a shooting process using a first camera and a second camera according to the present invention; Fig. 13 is a diagram for describing an embodiment of taking a photograph having a composition similar to that of a photograph taken previously according to the present invention; Fig. 14 is a diagram for describing an embodiment of automatically taking a picture in case of entry of a new object into a defined region according to the present invention; and Fig. 15 is a diagram for describing another embodiment of automatically capturing a photo upon entry of a new object into a defined region according to the present invention. A detailed description will now be given of embodiments disclosed herein, with reference to the accompanying drawings. For the purpose of a brief description with reference to the drawings, the same or equivalent components may be given the same or similar reference numbers, and their description will not be repeated. In general, a suffix such as "module" and "unit" may be used to designate elements or components. [0008] The use of such a suffix herein is purely intended to facilitate the description of the specification, and the suffix itself is not meant to provide any special meaning or function. In the present disclosure, what is well known to those skilled in the art has generally been omitted for the sake of brevity. The accompanying drawings are used to facilitate the understanding of various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed as extending to any modification, equivalent and substitute in addition to those specifically set out in the accompanying drawings. It will be understood that although the terms first, second, etc. may be used herein to describe different elements, these elements should not be limited by these terms. In general, these terms are only used to distinguish one element from another. It should be understood that when an element is designated as being "connected to" another element, the element may be connected to the other element or intermediate elements may also be present. On the contrary, when an element is designated as being "directly connected to" another element, no intermediate element is present. A singular representation may include a plural representation unless it has an absolutely different meaning in relation to the context. [0009] Terms such as "include" or "have" are used herein, and it should be understood that they are intended to indicate the existence of several components, functions or steps disclosed in the specification, and it should also be understood that more or fewer components, functions or steps may likewise be used. The mobile terminals presented here can be implemented using a variety of different types of terminals. Examples of these devices include cell phones, smart phones, user equipment, laptops, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, personal computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays), and the like.
[0010] By way of non-limiting example only, a more complete description will be made with reference to particular types of mobile terminals. However, such teachings also apply to other types of terminals, such as the aforementioned types. In addition, these teachings can also be applied to fixed terminals such as digital TVs, desktop computers and the like. We will now refer to Figures 1-3, where Figure 1 is a block diagram of a mobile terminal according to the present disclosure, and Figures 2 and 3 are conceptual representations of an example of the mobile terminal, seen from different directions. [0011] The mobile terminal 100 is shown with components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power supply unit 190. It should be understood that the use of all the illustrated components is not a requirement, and that more or fewer components may alternatively be used. Referring now to Figure 1, the mobile terminal 100 is shown as having a wireless communication unit 110 configured with a plurality of commonly implemented components. For example, the wireless communication unit 110 typically includes one or more components that enable wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. The wireless communication unit 110 generally includes one or more modules that make it possible to establish communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 generally includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of the following: a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115. [0012] The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a programmable key and the like) to allow a user to enter information. Data (e.g., audio, video, images, and the like) is obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user controls, and combinations thereof. The detection unit 140 is generally implemented by means of one or more sensors configured to detect the internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in Figure 1, the detection unit 140 is shown as having a proximity sensor 141 and a lighting sensor 142.
If necessary, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, a camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (e.g., an electronic nose, a medical sensor, a biometric sensor, and the like), to name only a few. The mobile terminal 100 may be configured to use the information obtained by the detection unit 140, and in particular the information obtained from one or more sensors of the detection unit 140, and their combinations. The output unit 150 is generally configured to output various types of information, such as sound, video, tactile output, and the like. The output unit 150 is shown with at least one of a display unit 151, an audio output module 152, a haptic module 153 and an optical output module 154. The display unit 151 may have an intermediate layer structure or an integrated structure with a touch sensor in order to facilitate a touch screen. [0013] The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as operate as a user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface to the various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, headphone ports, and the like. In some cases, the mobile terminal 100 may perform an assortment of control functions associated with a connected external device, when the external device is connected to the interface unit 160. The memory 170 is generally implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions relating to the operation of the mobile terminal 100, and the like. Some of these application programs can be downloaded from an external server by wireless transmission. Other application programs may be installed in the mobile terminal 100 at the time of manufacture or shipment, which is generally the case for the basic functions of the mobile terminal 100 (for example, receiving a call, establishing a call, receiving a message, sending a message, and the like). It is common for an application program to be stored in the memory 170, installed in the mobile terminal 100 and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The controller 180 generally has the function of controlling the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 can provide or process information or functions appropriate for a user by processing the signals, data, information and the like, which are input or output by the various components illustrated in Figure 1, or by activating the application programs.
For example, the controller 180 controls some or all of the components illustrated in Figures 1 to 3 according to the execution of an application program that has been stored in the memory 170. The power supply unit 190 may be configured to receive external power or provide internal power in order to supply the amount of power required to operate the elements and components contained in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery can be configured to be integrated with the terminal body, or configured to be detachable from the terminal body. Referring again to Figure 1, we will now describe various components illustrated in this figure in more detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is generally configured to receive a broadcast signal and/or broadcast-related information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to support switching between the broadcast channels. The mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities. [0014] Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. These network entities are part of a mobile communication network, which is built according to technical standards or communication methods for mobile communications (for example, Global System for Mobile Communication (GSM), Code Division Multiple Access (CDMA), Code Division Multiple Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like). The wireless signals transmitted and/or received via the mobile communication module 112 may include audio call signals, video call (telephony) signals, or various data formats intended to allow the communication of text and multimedia messages. [0015] The wireless Internet module 113 is configured to facilitate wireless Internet access. This module can be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 can transmit and/or receive wireless signals over communication networks using wireless Internet technologies. [0016] Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like. The wireless Internet module 113 may transmit and/or receive data according to one or more of these wireless Internet technologies, as well as other Internet technologies.
In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like, as part of a mobile communication network, the wireless Internet module 113 provides such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or serve as, the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Technologies suitable for implementing these short-range communications include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-Wide Band (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless Universal Serial Bus (Wireless USB), and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, through wireless local area networks. One example of the wireless local area networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured in the same manner as the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or to cooperate in another way with the mobile terminal 100). The short-range communication module 114 may detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that has been authenticated to communicate with the mobile terminal 100, the controller 180, for example, can transmit data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114. As a result, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. In addition, when a message is received in the mobile terminal 100, the user can view the received message using the wearable device. The location information module 115 is generally configured to detect, calculate, obtain or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a Global Positioning System (GPS) module, a Wi-Fi module, or both. If necessary, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. For example, when the mobile terminal uses a GPS module, the position of the mobile terminal can be acquired through a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, the position of the mobile terminal can be acquired from information relating to a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to allow various types of inputs to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input.
The image and video inputs are often obtained by means of one or more cameras 121. These cameras 121 can process still image or video frames obtained by image sensors in video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to allow the input of a plurality of images having various angles or focal lengths into the mobile terminal 100. As another example, the cameras 121 can be arranged in a stereoscopic arrangement to acquire left and right images allowing the implementation of a stereoscopic image. The microphone 122 is generally implemented to allow audio input into the mobile terminal 100. The audio data may be processed in a variety of ways according to a function performed in the mobile terminal 100. If necessary, the microphone 122 may include an assortment of noise suppression algorithms to eliminate unwanted noise produced during the reception of external audio. The user input unit 123 is a component that allows input by a user. This user input may allow the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or on a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As an example, the touch-sensitive input can be a virtual key or soft key, which is displayed on a touch screen by software processing, or a touch pad located on the mobile terminal at a location other than the touch screen. On the other hand, the virtual key or the visual key can be displayed on the touch screen in various forms, for example, graphics, text, icon, video, or a combination thereof. The detection unit 140 is generally configured to detect one or more of the following: internal information of the mobile terminal, information about the surrounding environment of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal according to the detection provided by the detection unit 140. The detection unit 140 may be implemented by any of a variety of sensors, some of which will now be described in more detail. The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like without mechanical contact. The proximity sensor 141 may be arranged in an internal region of the mobile terminal covered by the touch screen, or near the touch screen. [0017] The proximity sensor 141 may include, for example, any one of the following sensors: a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like.
When the touch screen implemented is of the capacitive type, the proximity sensor 141 can detect the proximity of a pointer relative to the touch screen by the variations of an electromagnetic field, which reacts to the approach of an object with conductivity. In this case, the touch screen (touch sensor) can also be classified in the category of a proximity sensor. The term "proximity touch" will be frequently used herein to refer to a scenario in which a pointer is placed so as to be near the touch screen without touching the touch screen. The term "contact touch" will be frequently used herein to refer to a scenario in which a pointer is physically in contact with the touch screen. The position corresponding to the proximity touch of the pointer relative to the touch screen corresponds to a position in which the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (e.g., distance, direction, velocity, time, position, motion state, and the like). In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and displays visual information on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to perform different operations or to process different data depending on whether the touch relative to a point of the touch screen is a proximity touch or a contact touch. A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, by any of a variety of touch methods. Examples of such touch methods include the resistive type, the capacitive type, the infrared type, and the magnetic field type, among others. As an example, the touch sensor may be configured to convert variations of a pressure applied to a specific portion of the display unit 151, or a capacitance appearing at a specific portion of the display unit 151, into electrical input signals. The touch sensor can also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of conventional touch objects include a finger, a stylus, a pointer, and the like. When a touch input is detected by a touch sensor, corresponding signals can be transmitted to a touch controller. The touch controller can process the received signals, and then transmit the corresponding data to the controller 180. Thus, the controller 180 can detect the region of the display unit 151 that has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof. [0018] In some embodiments, the controller 180 may execute the same or different commands depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or a different command depending on the object that provides a touch input can be decided on the basis of the current operating status of the mobile terminal 100 or an application program currently being executed, for example. The touch sensor and the proximity sensor can be used individually, or in combination, to detect various types of touches.
Such touches include a short touch (or tap), a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If necessary, an ultrasonic sensor may be implemented to recognize positional information relating to a touch object by means of ultrasonic waves. The controller 180 may, for example, calculate the position of a wave emission source from information detected by a light sensor and a plurality of ultrasonic sensors. Because light is much faster than ultrasonic waves, the time taken by light to reach the optical sensor is much shorter than the time taken by the ultrasonic wave to reach the ultrasonic sensor. The position of the wave emission source can be calculated on the basis of this fact. For example, the position of the wave emission source can be calculated using the time difference from when the ultrasonic wave reaches the sensor, using light as a reference signal (an illustrative sketch of this calculation follows below). [0019] The camera 121 generally includes at least one camera sensor (CCD, CMOS, etc.), a photodetector (or image sensors), and a laser sensor. The implementation of the camera 121 with a laser sensor can allow the detection of the touch of a physical object with respect to a 3D stereoscopic image. The photodetector may be superimposed on, or overlapped with, the display device. The photodetector may be configured to scan the movement of the physical object near the touch screen. In more detail, the photodetector may include photodiodes and transistors at rows and columns for scanning content received by the photodetector by means of an electrical signal that varies depending on the amount of light applied. In other words, the photodetector can calculate the coordinates of the physical object according to a variation of light in order to obtain information on the position of the physical object. [0020] The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display the information of an execution screen of an application program currently being executed in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the information of the execution screen. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A conventional stereoscopic display unit may employ a stereoscopic display system such as a stereoscopic system (glasses system), an auto-stereoscopic system (glasses-free system), a projection system (holographic system), or the like. The audio output module 152 is generally configured to produce audio data. This audio data can be obtained from any one of several different sources, such that the audio data can be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data can be produced during modes such as a signal receiving mode, a calling mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like. The audio output module 152 may provide an audible output relating to a particular function (e.g., a call signal receiving sound, a message receiving sound, etc.) performed by the mobile terminal 100. The audio output module 152 can also be implemented in the form of a receiver, a speaker, a buzzer, or the like.
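The ultrasonic time-difference calculation mentioned above can be sketched as follows. This is a minimal illustration, assuming the speed of sound in air and treating the light arrival time as the emission time; it is not code from the patent.

```python
# Minimal sketch: distance from the wave source to a sensor, using light as the
# reference signal. Light arrives almost instantly, so its arrival time stands
# in for the emission time. SPEED_OF_SOUND is an assumed constant.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def source_distance(t_light_arrival, t_ultrasound_arrival):
    """Distance (m) from the emission source to the ultrasonic sensor."""
    dt = t_ultrasound_arrival - t_light_arrival  # time-of-flight of the sound
    return SPEED_OF_SOUND * dt

# With a plurality of ultrasonic sensors, the per-sensor distances can be
# combined (e.g., by trilateration) to estimate the position of the source.
```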
A haptic module 153 may be configured to produce various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect produced by the haptic module 153 is vibration. The intensity, pattern, and the like of a vibration produced by the haptic module 153 may be controlled by a user selection or a setting of the controller. For example, the haptic module 153 may emit different vibrations in a combined manner or in a sequential manner. [0021] In addition to vibrations, the haptic module 153 can produce various other tactile effects, such as a stimulation effect such as a pin arrangement moving vertically to touch the skin, a spray force or an air suction force through a jet orifice or a suction opening, a touch to the skin, the contact of an electrode, an electrostatic force, an effect reproducing the sensation of heat or cold with the aid of an element capable of absorbing or producing heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation such as through the fingers or the arm of the user, as well as to transmit the tactile effect through direct contact. Two or more haptic modules 153 may be provided depending on the particular configuration of the mobile terminal 100. An optical output module 154 may emit a signal to indicate the creation of an event using light from a light source. Examples of events created in the mobile terminal 100 include receiving a message, receiving a call signal, a missed call, an alarm, a calendar notification, receiving an electronic message, receipt of information through an application, and the like. A signal emitted by the optical output module 154 may be implemented in such a way that the mobile terminal emits monochromatic light or light having a plurality of colors. The emitted signal may end when the mobile terminal detects that a user has consulted the created event, for example. [0022] The interface unit 160 serves as an interface for connecting external devices to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted by an external device, receive power to be transmitted to elements and components of the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headphone ports, external power ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, headphone ports, or the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, the device with the identification module (also called here "identification device") can take the form of a smart card. As a result, the identification device can be connected to the terminal 100 via the interface unit 160. [0023] When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow the power from the cradle to supply the mobile terminal 100, or can serve as a passage to allow various command signals, entered by the user through the cradle, to be transmitted to the mobile terminal.
Various control signals or a power input from the cradle may act as signals enabling the mobile terminal to recognize that it is properly mounted on the cradle. The memory 170 may store programs to operate the controller 180 and store input and/or output data (e.g., a directory, messages, still images, videos, etc.). The memory 170 can store data relating to the various vibration and sound patterns that are emitted in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media, including a flash memory, a hard disk, a solid state disk, a silicon disk, a micro multimedia card type, a card type memory (e.g., SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 can also be used in conjunction with a network storage device that provides the storage function of the memory 170 on a network, such as the Internet. The controller 180 can generally control the general operations of the mobile terminal 100. For example, the controller 180 can set or release a lock state to prevent a user from entering a control command relating to applications when the state of the mobile terminal fulfills a predefined condition. The controller 180 may also perform the controls and processes associated with voice calls, data transmissions, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 may control one or a combination of these components to implement various exemplary embodiments disclosed herein. The power supply unit 190 receives external power or provides internal power and supplies the amount of power required to operate the respective elements and components contained in the mobile terminal 100. The power supply unit 190 may include a battery, which is generally rechargeable or detachably coupled to the terminal body for recharging. The power supply unit 190 may include a connection port. The connection port may be configured as an example of the interface unit 160 to which an external charger is electrically connected to provide power for recharging the battery. As another example, the power supply unit 190 may be configured to recharge the battery wirelessly without using the connection port. [0024] In this example, the power supply unit 190 can receive power, transmitted by an external wireless power transmitter, using at least one of an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance. [0025] Various embodiments described herein may be implemented on a computer-readable medium, a machine-readable medium, or a similar medium by, for example, software, hardware, or any combination of these. Referring now to Figures 2 and 3, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may also be implemented in any one of a number of different configurations.
Examples of such configurations include the watch type, the clip type, the glasses type, or a folder type, a flip type, a slide type, a swing type, and a swivel type in which two or more bodies are associated with each other in a relatively movable manner, and their combinations. The discussion herein will often relate to a particular type of mobile terminal (e.g., bar type, watch type, glasses type, and the like). However, such teachings relating to a particular type of mobile terminal generally apply to other types of mobile terminals as well. The mobile terminal 100 generally includes a housing (e.g., frame, casing, cover, and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front case 101 and a rear case 102. Various electronic components are integrated in a space formed between the front case 101 and the rear case 102. At least one middle case can be further positioned between the front case 101 and the rear case 102. The display unit 151 is shown on the front side of the terminal body for providing information. As seen, a window 151a of the display unit 151 may be mounted on the front case 101 to form the front surface of the terminal body together with the front case 101. In some embodiments, electronic components can also be mounted on the rear case 102. Examples of these electronic components include a removable battery 191, an identification module, a memory card, and the like. The back cover 103 is shown as covering the electronic components, and this cover can be detachably coupled to the rear case 102. Therefore, when the back cover 103 is detached from the rear case 102, the electronic components mounted on the rear case 102 are exposed to the outside. [0026] As seen, when the back cover 103 is coupled to the rear case 102, a side surface of the rear case 102 is partially exposed. In some cases, during the coupling, the rear case 102 may also be completely covered by the back cover 103. In some embodiments, the back cover 103 may include an opening for exposing a camera 121b or an audio output module 152b to the outside. The cases 101, 102, 103 may be formed by injection molding of a synthetic resin or may be made of a metal, for example, stainless steel, aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which the plurality of cases form an internal space for housing components, the mobile terminal 100 may be configured such that a single case forms the internal space. In this example, a mobile terminal 100 having a single body is formed in such a way that a synthetic resin or metal extends from a side surface to a rear surface. If necessary, the mobile terminal 100 may include a waterproof unit (not shown) for preventing water from entering the terminal body. For example, the waterproof unit may include a sealing member that is located between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the back cover 103, to seal an internal space when these cases are coupled. Figures 2 and 3 illustrate some components as arranged on the mobile terminal. However, it should be understood that alternative arrangements are possible and within the scope of the teachings of the present disclosure. Some components may be omitted or rearranged.
For example, the first manipulation unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on a side surface of the terminal body. [0027] The display unit 151 displays information processed in the mobile terminal 100. The display unit 151 may be implemented by means of one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light emitting diode (OLED), a flexible display, a three-dimensional (3D) display, an electronic ink display, and combinations thereof. [0028] The display unit 151 can be implemented by means of two display devices, which can implement the same or different display technologies. For example, a plurality of display units 151 may be arranged on one side, spaced from each other, or these devices may be integrated, or these devices may be arranged on different surfaces. [0029] The display unit 151 may also include a touch sensor that detects a touch input received in the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to detect that touch and the controller 180 may, for example, produce a control command or other signal corresponding to the touch. Content that is input in a tactile manner can be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured as a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or as a wire whose pattern is printed directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or in the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen can serve as a user input unit 123 (see Figure 1). Therefore, the touch screen can replace at least a portion of the functions of the first manipulation unit 123a. [0030] The first audio output module 152a may be implemented as a speaker for outputting voice audio, alarm sounds, multimedia audio reproduction, and the like. The window 151a of the display unit 151 generally includes an aperture for allowing audio content produced by the first audio output module 152a to pass. One alternative is to allow audio content to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front case 101). In this case, a hole independently formed for emitting audio sounds may not be seen or is otherwise hidden in appearance, further simplifying the appearance and fabrication of the mobile terminal 100. The optical output module 154 may be configured to emit light to indicate the creation of an event. Examples of such events include receiving a message, receiving a call signal, a missed call, an alarm, a calendar notification, receiving an electronic message, receiving information through an application, and the like. When a user has viewed a created event, the controller may control the optical output module 154 to stop the light emission. The first camera 121a may process image frames such as still or moving images obtained by the image sensor in capture mode or in video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170.
The first and second manipulation units 123a and 123b are examples of the user input unit 123, which can be manipulated by a user to provide an input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform a manipulation such as touching, pressing, scrolling, or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform a manipulation such as a proximity touch, a hovering, or the like. [0031] Figure 2 shows the first manipulation unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. An input received by the first and second manipulation units 123a and 123b can be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide a menu input, a home key, a cancel, a search, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control the sound volume output from the first or second audio output modules 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide a power on/off input, a start, an end, a scroll, to control the sound volume emitted by the first or second audio output modules 152a or 152b, to switch to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to allow a touch input, a push input, or combinations thereof. The rear input unit may be located so as to overlap the display unit 151 on the front side in the thickness direction of the terminal body. As an example, the rear input unit may be located on an upper end portion of the rear side of the terminal body so that a user can easily manipulate it using the index finger when the user grabs the terminal body with one hand. Alternatively, the rear input unit can be positioned almost anywhere on the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first manipulation unit 123a in the rear input unit. Thus, in cases where the first manipulation unit 123a is omitted from the front side, the display unit 151 may have a larger screen. [0032] As another possibility, the mobile terminal 100 may include a finger scan sensor which scans the fingerprint of the user. The controller 180 can then use the fingerprint information detected by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown as being located at one end of the mobile terminal 100, but other locations are possible. If necessary, several microphones can be implemented, according to an arrangement that allows the reception of stereo sounds. The interface unit 160 may serve as a path for the mobile terminal 100 to communicate with external devices.
For example, the interface unit 160 may include one or more of the following: a connection terminal enabling it to connect to another device (for example, a headset, an external speaker, or the like), a port for near field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power terminal for powering the mobile terminal 100. The interface unit 160 may be implemented in the form of a receptacle intended to receive an external card, for example a Subscriber Identity Module (SIM), a User Identity Module (UIM), or a memory card for storing information. The second camera 121b is shown as being located on the rear side of the terminal body and has an image capture direction which is substantially opposite to the image capture direction of the first camera 121a. If necessary, the second camera 121b may alternatively be located at other locations, or made to be movable, so as to have an image capture direction different from that shown. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in a variety of ways by means of the plurality of lenses, producing images of better quality. As shown in Fig. 3, a flash 124 is shown next to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in Figure 2, the second audio output module 152b may be located on the body of the terminal. The second audio output module 152b may implement stereophonic sound functions together with the first audio output module 152a, and may also be used to implement a speakerphone mode for voice communications. At least one antenna for wireless communication may be located on the body of the terminal. The antenna can be installed in the body of the terminal or formed by the case. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the body of the terminal. Alternatively, an antenna may be formed by means of a film attached to an inner surface of the back cover 103, or a case that includes a conductive material. A power supply unit 190 for powering the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body. The battery 191 may receive power via a power cable connected to the interface unit 160. In addition, the battery 191 can be recharged wirelessly by means of a wireless charger. Wireless recharging can be implemented by magnetic induction or electromagnetic resonance. The back cover 103 is shown coupled to the rear case 102 to protect the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external shocks or foreign matter. When the battery 191 is detachable from the terminal body, the back cover 103 may be detachably coupled to the rear case 102. An accessory for protecting the appearance of the mobile terminal 100 or for assisting or extending its operation may also be provided on the mobile terminal 100. As an example of an accessory, a cover or pouch for covering or receiving at least one surface of the mobile terminal 100 may be provided.
The cover or pouch can cooperate with the display unit 151 to extend the operation of the mobile terminal 100. Another example of an accessory is a stylus that assists or extends a touch input to a touch screen. In this specification, a preview image may include an image input through the camera 121 of the mobile terminal 100 by a user's manipulation. In addition, since a preview image is displayed through the display unit 151, a user can check a photo before taking it or a video before recording it. While a preview image is displayed, if a shooting instruction is input, the controller 180 may take a picture or record a video. In this specification, a subject may mean a photography target in which a user is interested. A subject can be a person, an animal, a thing, or the like. It goes without saying that a subject can also be a face, a part of a body, a part of a thing, or the like. In this specification, an object may mean a person, an animal, a thing, or the like included in a user-defined region. It goes without saying that a face, a part of a body, a part of a thing, or the like can also be an object. The "camera" terminology used in the following embodiments may indicate at least one of the front camera 121a exposed on a front side of the mobile terminal 100 and the rear camera 121b exposed on a rear side of the mobile terminal 100. However, in describing the embodiments for which the front camera 121a and the rear camera 121b are to be distinguished from each other, the terminologies "first camera" and "second camera" may be used. In particular, each of the first camera and the second camera may indicate the front camera 121a or the rear camera 121b. In the following description, we will explain embodiments of a control method implemented in the mobile terminal configured above, with reference to the accompanying drawings. Fig. 4 is a flowchart of an operation of a mobile terminal according to an embodiment of the present invention. Referring to Fig. 4, if the camera 121 is activated by a user's manipulation, the controller 180 can control the display of a preview image, which is input through the camera 121, on the display unit 151 [S401]. While the preview image is displayed, a specific subject can be selected from the preview image or a specific region can be set on the preview image [S402]. In this case, if the selected subject or a specific object included in the defined region deviates from the preview image, or if the subject or object is obscured by another object, the controller 180 can control an alarm to be produced [S403]. The mobile terminal 100 according to the present invention is described in detail with reference to the accompanying drawings in the following manner.
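The flow of steps S401 to S403 can be summarized in a short sketch. The `camera`, `tracker`, and `alarm` objects and the helpers below are hypothetical stand-ins, not the patent's implementation or any real device API.

```python
# Hypothetical sketch of the S401-S403 flow. camera, tracker, alarm,
# wait_for_user_selection() and is_obscured() are assumed helpers.
def monitor_subject(camera, tracker, alarm):
    frame = camera.next_preview_frame()            # S401: preview is displayed
    subject_box = wait_for_user_selection(frame)   # S402: touch or region input
    tracker.init(frame, subject_box)
    while camera.is_previewing():
        frame = camera.next_preview_frame()
        box = tracker.update(frame)                # follow the selected subject
        if box is None or is_obscured(frame, box): # left the preview / blocked
            alarm.produce()                        # S403: produce an alarm
```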
[0033] Figs. 5A to 5C are diagrams of examples for illustrating an operation of a mobile terminal in a case where a selected subject, or an object included in a selected region, deviates from a preview image. In particular, Fig. 5A is a diagram for illustrating the operation in a case where a selected subject deviates from a preview image. And Fig. 5B is a diagram for illustrating the operation in a case where an object included in a selected region deviates from a preview image. Referring to Fig. 5A(a), based on a user's touch input on a preview image, the controller 180 can select a specific subject within the preview image. In the example shown in Fig. 5A(a), a user input of touching a surfer 510 is received. Thus, the controller 180 can control the visual identification of the selected subject within the preview image. In the example shown in Fig. 5A(a), to identify the surfer touched by the user, a dashed outline 515 is displayed on the surfer. However, a method of visually identifying a selected subject is not limited to the method shown in Fig. 5A(a). Therefore, the controller 180 is capable of controlling the visual identification of a selected subject using a method different from that shown in Fig. 5A(a). According to the example shown in Fig. 5A(a), a user input for selecting a subject is an input of touching the subject on a preview image, but user input for selecting a subject is not limited thereto. For example, using a separate menu entry, a physical key manipulation, or the like, it is possible to select a subject on a preview image. If a subject is selected, the controller 180 can detect a change of the subject such as a movement, an enlargement, a reduction, or the like. The controller 180 may also be capable of automatically selecting a subject on a preview image. For example, if an object such as a character, an animal, an article of recognizable shape, or the like is discovered in a preview image, the controller 180 can select the discovered object as a subject. Referring to Fig. 5A(b), when the subject 510 selected from the preview image shifts to the right, as in the example shown in Fig. 5A(b), if at least a part of the subject 510 deviates from the preview image, the controller 180 can control the production of an alarm via the mobile terminal 100. In this case, the alarm can include at least one of a sound produced by the audio output unit 152, a vibration produced through the haptic module 153, and a light emanating from a light source (e.g., LEDs, etc.). Thus, the controller 180 can determine a center of the produced alarm according to a direction in which the subject 510 disappears. For example, when the generated alarm is a vibration 520, as in the example shown in Fig. 5A(b), if the subject 510 deviates to the right of the preview image, the controller 180 can control the production of a stronger vibration 520 from a right side of the mobile terminal 100. In the case where the alarm is of an audio type, the controller 180 can control the production of a sound from a right side of the mobile terminal 100 only [that is, a sound is produced by a speaker located on the right side only]. Alternatively, the controller 180 can control the production of a sound so that a sound produced from the right side is louder than that produced from the left side. In the case where the alarm is of the light type, the controller 180 can control the production of light from a right side of the mobile terminal 100 only. Alternatively, the controller 180 can control the production of light such that light produced from the right side is stronger than that produced from the left side. [0034] In addition, the controller 180 may be adapted to adjust a strength or count of an alarm output according to the degree of deviation of the subject 510 from the preview image; this direction-weighted behavior is sketched below.
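The direction-weighted alarm described above can be illustrated with a small geometric computation. The following sketch is an assumption (frame and bounding-box coordinates only) of how the alarm strength on each side could be scaled by the degree of deviation; it is not the terminal's actual haptic control.

```python
# Sketch: the side toward which the subject exits alarms harder, scaled by
# how much of the subject's box has left the frame. Values are assumptions.
def directional_alarm(frame_w: int, box_x: int, box_w: int) -> dict:
    """box_x may be negative or exceed frame_w when the subject exits."""
    overflow_right = max(0, (box_x + box_w) - frame_w)
    overflow_left = max(0, -box_x)

    def level(overflow):
        # Degree of deviation: fraction of the subject outside the frame.
        return min(1.0, overflow / box_w) if box_w else 0.0

    return {"left": level(overflow_left), "right": level(overflow_right)}

# Subject 510 drifting off the right edge of a 1280-px-wide preview:
print(directional_alarm(1280, 1200, 200))  # {'left': 0.0, 'right': 0.6}
```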
Once the subject 510 has deviated completely from the preview image, as in the example shown in Fig. 5A(c), if the subject reappears on the preview image, the controller 180 may control the generation of an alarm via the mobile terminal 100. The alarm generated in response to the reappearance of the subject 510 on the preview image may have a pattern different from that of the previous case where at least part of the subject 510 deviates from the preview image. In the example shown in Fig. 5A, only one subject is selected in response to a user's touch input. In addition, the controller 180 may be able to select a plurality of subjects from a preview image. Thus, if a determined subject among a plurality of subjects deviates from the preview image, or if all subjects deviate from the preview image, the controller 180 can control the production of an alarm. Referring to Fig. 5B(a), based on a user's touch input on a preview image, the controller 180 can select a specific region within the preview image. For example, if a touch input of moving a pointer along a trace of a closed curve on the preview image is received from the user, the controller 180 can select the region in the space formed by the closed curve as the specific region 520. According to the example shown in Fig. 5B(a), a user input of moving a pointer along a trace surrounding the circumference of a surfer 510 is received. Thus, as mentioned in the foregoing description with reference to Fig. 5A, the controller 180 can control the visual identification of the selected region. According to the example shown in Fig. 5B(a), a user input for defining a specific region is a touch input of drawing a closed curve with a pointer on a preview image, but user input for defining a specific region is not limited thereto. For example, using a separate menu entry, a physical key manipulation, or the like, it is possible to define a specific region of a preview image. If the specific region is set, the controller 180 can detect a change of an object, which is included in the specific region, such as a movement, an enlargement, a reduction, or the like. According to the example shown in Fig. 5B(a), the surfer 510 is included in the defined specific region 520. Therefore, the controller 180 may be able to detect a change of the surfer 510 such as a movement, an enlargement, a reduction, or the like. When the object 510 included in the defined region moves to the right, as in the example shown in Fig. 5B(b), if at least a portion of the object 510 deviates from the preview image, the controller 180 may control an alarm to be generated via the mobile terminal 100. In this case, the alarm may include at least one of a sound produced through the audio output unit 152, a vibration produced through the haptic module 153, and a light emanating from a light source (e.g., LED, etc.). In addition, as mentioned in the foregoing description with reference to Fig. 5A, a center of the alarm output may be set in a direction in which the object 510 disappears, or a strength or count of the alarm output can be adjusted according to the degree of deviation of the object 510 from the preview image. Once the object 510 has deviated completely from the preview image, as in the example shown in Fig. 5B(c), if the object 510 reappears on the preview image, the controller 180 may control the generation of an alarm via the mobile terminal 100. The alarm generated in response to the reappearance of the object 510 on the preview image may have a pattern different from that of the previous case where at least a portion of the object 510 deviates from the preview image.
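For illustration, a closed-curve touch trace can be treated as a polygon, and the containment of an object's center can be tested with a standard ray-casting check. The sketch below is a minimal assumption of such a test; the disclosure does not specify the geometric method.

```python
# Sketch: turn a closed-curve touch trace into a region and check whether
# an object's center (px, py) is still inside it (ray-casting test).
def point_in_region(px: float, py: float, trace: list) -> bool:
    inside = False
    n = len(trace)
    for i in range(n):
        x1, y1 = trace[i]
        x2, y2 = trace[(i + 1) % n]        # wrap: the curve is closed
        if (y1 > py) != (y2 > py):         # edge crosses the ray's height
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

trace = [(100, 100), (400, 100), (400, 400), (100, 400)]  # region 520
print(point_in_region(250, 250, trace))  # True: object still in the region
print(point_in_region(450, 250, trace))  # False: object left the region
```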
According to the example shown in Fig. 5B, if the object 510 included in the defined region 520 deviates from the preview image, the alarm is generated. For another example, if the object 510 included in the defined region 520 deviates from the defined region 520 itself, the controller 180 can also control the production of the alarm. [0035] For example, Fig. 5C is a diagram for illustrating an operation of a mobile terminal in a case where an object included in a defined region deviates from the defined region. Referring to Fig. 5C(a), a region 520 including a surfer 510 is defined by a touch input of a user, for example. Subsequently, when the object 510 included in the defined region 520 moves to the right, as in the example shown in Fig. 5C(b), if the object 510 deviates from the defined region 520, the controller 180 can control the production of an alarm via the mobile terminal 100. [0036] Once the object 510 has completely moved away from the defined region 520, as in the example shown in Fig. 5C(c), if the object 510 reappears in the defined region 520, the controller 180 can control the generation of an alarm via the mobile terminal 100. The alarm generated in response to the reappearance of the object 510 in the defined region 520 may be of a type or pattern different from that of the previous case where at least a part of the object 510 deviates from the defined region 520. According to the examples shown in Fig. 5B and Fig. 5C, only one region is defined in response to a user's touch input. In addition, the controller 180 may be capable of defining a plurality of regions on a preview image. [0037] In this case, if at least one object included in one of the regions deviates from the preview image or from the corresponding region, or if all of the objects included in the regions deviate from the preview image or from their respective regions, the controller 180 may produce an alarm. Meanwhile, if a subject does not deviate from a preview image for a predefined time, or if an object within a defined region does not deviate from the preview image or the region, the controller 180 can command that a photo or video be taken automatically. However, even though the above condition is met, if a new object appears in the preview image or the defined region and the new object obscures the subject or the object included in the defined region, a photo or video may not be taken. In this case, the predefined time can be determined by a user or manufacturer setting; a sketch of this dwell-timer logic follows.
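The dwell condition of paragraph [0037] can be sketched as a small timer, shown below. The 2.0-second threshold is an assumed setting, and the in_frame/obscured flags are assumed to come from whatever tracking the terminal performs.

```python
# Sketch of the auto-shoot condition in [0037]: if the subject stays in the
# preview (and is not obscured) for a preset time, take the shot.
import time

class DwellTrigger:
    def __init__(self, hold_seconds: float = 2.0):  # assumed setting
        self.hold = hold_seconds
        self.since = None                  # when the subject became stable

    def update(self, in_frame: bool, obscured: bool) -> bool:
        """Feed one preview frame's state; True means 'shoot now'."""
        if not in_frame or obscured:
            self.since = None              # condition broken: restart timer
            return False
        if self.since is None:
            self.since = time.monotonic()
        return time.monotonic() - self.since >= self.hold

trigger = DwellTrigger(hold_seconds=2.0)
# In a camera loop: if trigger.update(subject_visible, subject_obscured): shoot
```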
[0038] The embodiments described with reference to Figs. 5A to 5C may also be applied to the case of shooting a video. In particular, while a video is being shot, if at least a part of a previously selected subject, or at least a part of an object included in a predefined region, deviates from the photographed image or from the predefined region, the controller 180 can control the production of an alarm via the mobile terminal 100. In addition, while a video is being shot, if at least a part of a subject or at least a part of an object deviates from the photographed image or the predefined region, the controller 180 can control the interruption of the video recording. This is described in detail with reference to Fig. 6A and Fig. 6B as follows. Fig. 6A and Fig. 6B are diagrams for illustrating an embodiment of interrupting a video recording and an embodiment of resuming the interrupted video recording. [0039] Referring to Fig. 6A(a), based on a touch input on a preview image, the controller 180 can select a specific subject 610 on the preview image. In the example shown in Fig. 6A(a), a user input of touching a surfer 610 is received. Subsequently, if a user input for video shooting is received, referring to Fig. 6A(b), the controller 180 records the inputted image and is also capable of controlling the production of the photographed image through the display unit 151. Thus, while the video is being shot, when the selected subject moves, if at least part of the subject 610 deviates from the photographed image, referring to Fig. 6A(c), the controller 180 can control the interruption of the video recording. When the controller 180 interrupts the recording of the video, the controller 180 may also control the production of an alarm, which indicates that the subject 610 has departed from the photographed image. [0040] Subsequently, if the subject 610 reappears on the photographed image, referring to Fig. 6A(d), the controller 180 can resume recording the video. Thus, the controller 180 can control the production of an alarm to indicate that the video recording has resumed. [0041] Referring to Fig. 6B(a), based on a touch input on a preview image, the controller 180 can define a specific region 620 on the preview image. In the example shown in Fig. 6B(a), the region 620 is defined to surround a surfer 610. Thereafter, if a user input for video shooting is received, referring to Fig. 6B(b), the controller 180 records an image inputted through the camera 121 and is also capable of controlling the production of the photographed image through the display unit 151. Thus, while the video is being shot, when the object 610 included in the defined region 620 moves, if at least a portion of the object 610 deviates from the photographed image, referring to Fig. 6B(c), the controller 180 can control the interruption of the video recording. When the controller 180 interrupts the video recording, the controller 180 may also control the production of an alarm to indicate that the object 610 has departed from the photographed image. Subsequently, if the object 610 reappears on the photographed image, referring to Fig. 6B(d), the controller 180 can resume video recording. Thus, the controller 180 can control the production of an alarm to indicate that the video recording has resumed. [0042] Further, if the object 610 included in the defined region deviates from the defined region 620, the controller 180 interrupts the video recording. Subsequently, if the object 610 reappears in the defined region 620, the controller 180 can resume video recording [not shown in the drawings]. According to the examples shown in Fig. 6A and Fig. 6B, by applying a touch input to a preview image before video shooting, the controller 180 is able to select a subject or define a specific region. Unlike the examples shown in Fig. 6A and Fig. 6B, the controller 180 may also be able to select a subject or define a specific region during video shooting. This pause/resume behavior is sketched below.
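The interrupt/resume behavior of Figs. 6A and 6B amounts to a two-state machine, sketched below. The print calls stand in for the alarm, and the recorder itself is omitted; this is an illustration, not the disclosed control path.

```python
# Sketch of the Figs. 6A-6B pause/resume behavior as a two-state machine.
from enum import Enum

class RecState(Enum):
    RECORDING = 1
    PAUSED = 2

def step(state: RecState, subject_in_frame: bool) -> RecState:
    if state is RecState.RECORDING and not subject_in_frame:
        print("ALARM: subject left - pausing recording")   # Fig. 6A(c)
        return RecState.PAUSED
    if state is RecState.PAUSED and subject_in_frame:
        print("ALARM: subject back - resuming recording")  # Fig. 6A(d)
        return RecState.RECORDING
    return state

state = RecState.RECORDING
for visible in [True, True, False, False, True]:   # simulated tracker output
    state = step(state, visible)                   # pauses once, resumes once
```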
[0043] In the following description, the operation of the terminal when a new object appears on a preview image or in a defined region is explained in detail with reference to the accompanying drawings. [0044] Figs. 7A and 7B are diagrams for illustrating an operation of a mobile terminal in case of appearance of a new object. Referring to Fig. 7A, two characters 710 and 720 are selected as subjects, for example [Fig. 7A(a)]. [0045] If a new object (for example, a football) 730 appears on the preview image [Fig. 7A(b)], or if a new object 730 appearing on the preview image obscures at least a part of the selected subjects (for example, the character indicated by reference number 720 is obscured in Fig. 7A) [Fig. 7A(c)], the controller 180 may control the generation of an alarm via the mobile terminal 100. In this case, the alarm may include at least one of a sound produced through the sound output unit 152, a vibration produced through the haptic module 153, and a light emanating from a light source (e.g., LED, etc.). Thus, the controller 180 can determine a center of the generated alarm according to a direction in which the new object 730 moves. For example, when the generated alarm is a vibration, if the new object 730 moves toward the right of the preview image from the left, the controller 180 commands that a vibration stronger than that of the right side be generated from the left side of the mobile terminal 100. Thereafter, as the new object 730 moves to the right side of the preview image, the controller 180 can control the progressive reduction of the vibration strength difference between the left side and the right side. In addition, the controller 180 may be able to adjust a strength or count of an alarm output depending on the number or size of the new object(s) 730 appearing on the preview image. Referring to Fig. 7B, two characters 710 and 720 are included in a defined region 740, for example [Fig. 7B(a)]. If a new object 730 appears in the preview image [Fig. 7B(b)], or if a new object 730 appearing on the preview image obscures at least a portion of the objects included in the defined region 740 [for example, the character indicated by reference number 720 is obscured in Fig. 7B(b)] [Fig. 7B(c)], the controller 180 can control the production of an alarm through the mobile terminal 100. [0046] Referring to Fig. 7B(d), if a new object 730 appears in the defined region 740, the controller 180 can control the generation of an alarm through the mobile terminal 100. The embodiments described with reference to Fig. 7A and Fig. 7B may also be applied to the case of shooting a video. In particular, while a video is being shot, if a new object appears on the photographed image, or if an object that recently appeared on the photographed image obscures a previously selected subject or an object included in a predefined region, the controller 180 may control the generation of an alarm via the mobile terminal 100. Moreover, while a video is being shot, if a new object appears on the photographed image, or if an object that recently appeared on the photographed image obscures a previously selected subject or an object included in a predefined region, the controller 180 can control the interruption of the video recording; one way to decide that a subject is obscured is sketched below.
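One plausible way to decide that a new object "obscures" a subject is to measure how much of the subject's bounding box the new object's box covers. The sketch below, including the 30% threshold, is an assumption; the disclosure does not state how occlusion is detected.

```python
# Sketch: occlusion as the fraction of the subject's box covered by the
# intruding object's box. Boxes are (x, y, w, h); thresholds are assumed.
def coverage(subject, intruder) -> float:
    sx, sy, sw, sh = subject
    ix, iy, iw, ih = intruder
    ox = max(0, min(sx + sw, ix + iw) - max(sx, ix))   # overlap width
    oy = max(0, min(sy + sh, iy + ih) - max(sy, iy))   # overlap height
    return (ox * oy) / (sw * sh)

def is_obscured(subject, intruder, threshold=0.3) -> bool:
    return coverage(subject, intruder) >= threshold

person = (350, 100, 120, 240)        # selected subject 720
ball = (390, 180, 60, 60)            # new object 730 passing in front
print(coverage(person, ball))        # 0.125: one eighth of 720 is covered
print(is_obscured(person, ball))     # False under the assumed 0.3 threshold
```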
This is described in detail with reference to Fig. 8A and Fig. 8B as follows. Fig. 8A and Fig. 8B are diagrams for illustrating an embodiment of interrupting a video recording and an embodiment of resuming the interrupted video recording. Referring to Fig. 8A(a), the controller 180 can select a specific subject on the preview image. According to the example shown in Fig. 8A(a), two characters 810 and 820 are selected as subjects. Next, referring to Fig. 8A(b), if a user input for video shooting is received, the controller 180 records the image inputted through the camera 121 and is also able to control the production of the photographed image via the display unit 151. Then, if a new object 830 appears on the photographed image, or if the recently appeared object 830 obscures at least one of the subjects 810 and 820, referring to Fig. 8A(c), the controller 180 can control the interruption of the video recording. When the controller 180 interrupts the video recording, the controller 180 may also control the production of an alarm to indicate the appearance of the new object 830. Next, referring to Fig. 8A(d), if the newly appeared object 830 disappears from the photographed image or no longer obscures the corresponding subject(s), the controller 180 may be able to resume video recording. Thus, the controller 180 can control the production of an alarm indicating that the recording has resumed. Referring to Fig. 8B(a), based on a touch input on a preview image, the controller 180 may define a specific region on the preview image. According to the example shown in Fig. 8B(a), the region 840 is defined to surround two characters 810 and 820. Thereafter, if a user input for video shooting is received, referring to Fig. 8B(b), the controller 180 records an image inputted through the camera 121 and is also capable of controlling the production of the photographed image through the display unit 151. Next, if a new object 830 appears on the photographed image, if the newly appeared object 830 obscures the object 810 or 820 included in the defined region 840, or if the new object 830 appears in the defined region 840, referring to Fig. 8B(c), the controller 180 can control the interruption of the video recording. When the controller 180 interrupts the video recording, the controller 180 may also control the production of an alarm to indicate the appearance of the new object 830. Thereafter, if the newly appeared object 830 disappears from the photographed image, no longer obscures the object included in the defined region 840, or deviates from the defined region 840, referring to Fig. 8B(d), the controller 180 can resume video recording. Thus, the controller 180 can control the production of an alarm to indicate that the video recording has resumed. According to the examples shown in Fig. 8A and Fig. 8B, by applying a touch input to a preview image before video shooting, the controller 180 is capable of selecting a subject or defining a specific region. In contrast to the examples shown in Fig. 8A and Fig. 8B, the controller 180 may also be able to select a subject or define a specific region during video shooting. According to the description with reference to Fig. 8A and Fig. 8B, the case of recording a video through the mobile terminal 100 is taken as an example. However, the present embodiment is similarly applicable to a case where a user is to photograph a still image. For example, although the mobile terminal 100 receives a shooting instruction, if a new object appears on the preview image, or if the newly appeared object obscures a subject or an object included in a defined region, the controller 180 is able to command that a shot not be made. If the recently appeared object deviates from the preview image, or if the recently appeared object no longer obscures the subject or object included in the defined region, the controller 180 can take the shot in response to the received shooting instruction.
The mobile terminal 100 according to the present invention can move a lens of the camera 121 to follow a subject, or an object included in a specific region, or can adjust a zoom magnification. When the controller 180 is not able to move the lens of the camera 121 any further, or is unable to adjust the zoom any further, if the selected subject or the object included in the specific region deviates from the preview image, the mobile terminal 100 can control the production of an alarm. For example, Fig. 9 is a diagram for illustrating an example of changing an angle of a camera lens by following a subject. For clarity of the following description, assume that the selected subject (or the object included in a defined region) is a skier 910. Referring to Fig. 9, if the skier 910 moves from the right to the left [Fig. 9(a)], the controller 180 can control maintaining the input of an image of the skier 910 by rotating the lens of the camera 121 to follow the skier 910 [Fig. 9(b)]. Subsequently, despite the lens of the camera 121 being rotated to its maximum, since the skier 910 continues to move to the left, if at least a portion of the skier 910 disappears from the preview image, the controller 180 can control the production of an alarm through the mobile terminal 100. The embodiment described with reference to Fig. 9 can also be applied to the case of rotating the lens of the camera 121 to follow a subject (or an object included in a defined region) on a photographed image during video shooting. Fig. 10 is a diagram for illustrating an example of tracking a subject by zooming in or out. For clarity of the following description, assume that the selected subject (or the object included in a defined region) is a skier 1010. Referring to Fig. 10(a), when the skier 1010 moves, the camera may be unable to fully photograph the skier 1010 at the current angle of view. In particular, at least a portion of the skier 1010 disappears from a preview image corresponding to the current angle of view. In this case, referring to Fig. 10(b), the controller 180 widens the angle of view by making the camera 121 zoom out, thereby controlling the skier 1010 to be fully included in the preview image. Next, referring to Fig. 10(c), despite the fact that the camera 121 can no longer zoom out, since the skier 1010 is still moving, if at least part of the skier 1010 disappears from the preview image, the controller 180 can control the generation of an alarm through the mobile terminal 100. In another example, in the case where the skier 1010 moves away from the camera 121, the controller 180 may control the camera 121 to zoom in to maintain the subject size at a predetermined level; a sketch of this zoom-out-then-alarm behavior follows.
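The Fig. 10 behavior can be sketched as: zoom out step by step while the subject's box is clipped by the frame, and raise the alarm once the widest angle is reached. The step size, zoom limit, and center-scaling model below are all assumptions.

```python
# Sketch of Fig. 10: widen the view while the subject is clipped; alarm
# once the widest supported angle is reached. All constants are assumed.
MIN_ZOOM = 1.0   # assumed widest angle the camera 121 supports

def fits(frame_w, frame_h, box):
    x, y, w, h = box
    return x >= 0 and y >= 0 and x + w <= frame_w and y + h <= frame_h

def track_zoom(zoom, frame_w, frame_h, box):
    cx, cy = frame_w / 2, frame_h / 2
    while not fits(frame_w, frame_h, box) and zoom > MIN_ZOOM:
        new_zoom = max(MIN_ZOOM, zoom - 0.1)       # zoom out one step
        scale = new_zoom / zoom                    # view widens, subject shrinks
        x, y, w, h = box
        box = (cx + (x - cx) * scale, cy + (y - cy) * scale,
               w * scale, h * scale)               # rescale about frame center
        zoom = new_zoom
    if not fits(frame_w, frame_h, box):
        print("ALARM: subject clipped even at the widest angle")  # Fig. 10(c)
    return zoom

# Skier 1010 halfway off the left edge of a 1280x720 preview at 2.0x zoom:
print(track_zoom(2.0, 1280, 720, (-80, 200, 160, 300)))  # ~1.7: skier fits again
```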
Once a subject has been selected on a preview image inputted by the first camera, or a specific region has been set on the preview image, a user input for deactivating the first camera and activating the second camera may then be received. If the subject selected in the preview image inputted through the first camera, or an object included in the defined region, is detected in a preview image inputted by the second camera, the controller 180 produces an alarm and is then able to command that a photo or video be taken automatically. This is described in detail with reference to Fig. 11 as follows. Fig. 11 is a diagram for illustrating an example of a shooting method using a first camera and a second camera according to the present invention. For the sake of clarity of the following description, assume that the first camera and the second camera are the front camera 121a and the rear camera 121b, respectively. And, assume that the shooting directions of the front camera 121a and the rear camera 121b are set as shown in Fig. 11(a). If the front camera 121a is activated, referring to Fig. 11(b), the controller 180 can command a preview image, which is inputted through the front camera 121a, to be displayed by the display unit 151. On the basis of a user input on the preview image, the controller 180 can select a subject or define a specific region. For clarity of description, as in the example shown in Fig. 11(b), assume that the selected subject, or the object included in the defined region, is a user's face 1110. Then, referring to Fig. 11(c), if a user input to switch to the rear camera 121b is received, the controller 180 can control the display of a preview image, which is inputted through the rear camera 121b, through the display unit 151. In this case, the user input for deactivating the front camera 121a and activating the rear camera 121b can include one of a gesture input, a touch input of touching a button on the display unit 151, a manipulation of a physical button located on the mobile terminal 100, a voice command, and the like. For example, the user input for deactivating the front camera 121a and activating the rear camera 121b may include a gesture input of rotating the mobile terminal 100 by a determined angle (for example, a gesture input of rotating the mobile terminal 100 by 180 degrees). In this case, based on a signal detected by the detection unit 140, the controller 180 can determine whether the mobile terminal 100 has been rotated by an angle greater than or equal to the determined angle. Referring to Fig. 11(d), when the mobile terminal 100 is rotated or moved, if the user's face selected on the preview image of the front camera 121a happens to be discovered on the preview image of the rear camera 121b, the controller 180 can control the production of an alarm via the mobile terminal 100. Subsequently, if the face of the user 1110 does not deviate from the preview image of the rear camera 121b for a predefined time, the controller 180 can control that a photo or video be taken automatically; one way to re-find the selected face is sketched below.
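Re-finding the selected face 1110 in the rear camera's preview requires face detection plus some matching criterion. The sketch below uses a Haar cascade and a simple HSV-histogram correlation as an assumed stand-in for a real recognizer; the 0.8 threshold is likewise an assumption.

```python
# Sketch: detect faces in the rear camera's frame, then match against the
# face selected on the front camera via HSV-histogram correlation.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_hist(img, box):
    x, y, w, h = box
    hsv = cv2.cvtColor(img[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def find_selected_face(rear_frame, reference_hist, threshold=0.8):
    gray = cv2.cvtColor(rear_frame, cv2.COLOR_BGR2GRAY)
    for box in detector.detectMultiScale(gray, 1.1, 5):
        score = cv2.compareHist(face_hist(rear_frame, box),
                                reference_hist, cv2.HISTCMP_CORREL)
        if score >= threshold:
            return box          # face 1110 re-found: alarm, then auto-shoot
    return None
```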
[0047] The embodiment described with reference to Fig. 11 is applicable to the case of activating the rear camera 121b first and then activating the front camera 121a later. In addition, although the controller 180 can select a subject on the preview image of the front camera 121a or define a determined region in response to a user input, the controller 180 is also able to define a determined subject or region within the preview image of the front camera 121a automatically. When a plurality of subjects are selected on a preview image of the first camera, or when a plurality of regions are selected, if all of the subjects, or all of the objects included in each of the regions, are discovered on a preview image of the second camera, the controller 180 can control the production of an alarm. When the subjects or objects are only partially discovered, if the remainder of the subjects or objects cannot be discovered, the controller 180 may move the camera lens or change a zoom magnification of the camera, so that the second camera fully receives an image of all of the subjects or objects. This is described in detail with reference to Figs. 12A to 12C as follows. Figs. 12A to 12C are diagrams for illustrating another example of a shooting method using a first camera and a second camera according to the present invention. For the sake of clarity of the following description, assume that the first camera and the second camera are the front camera 121a and the rear camera 121b, respectively. And, assume that the initial shooting directions of the front camera 121a and the rear camera 121b are set as shown in Fig. 12A(a). If the front camera 121a is activated, referring to Fig. 12A(b), the controller 180 can command a preview image, which is inputted through the front camera 121a, to be displayed by the display unit 151. On the basis of a user input on the preview image, the controller 180 can select subjects or define a specific region. For clarity of the description, as in the example shown in Fig. 12A(b), assume that the selected subjects, or the objects included in the defined region, are a user's face 1210 and a flag 1220. [0048] Next, referring to Fig. 12A(c), if a user input to switch to the rear camera 121b is received, the controller 180 can control the display of a preview image, which is inputted through the rear camera 121b, through the display unit 151. [0049] When the mobile terminal 100 is rotated or moved, it may happen that all the subjects 1210 and 1220 selected on the preview image of the front camera 121a are discovered in the preview image of the rear camera 121b. If this is the case, as mentioned in the previous description with reference to Fig. 11, the controller 180 controls that an alarm be generated via the mobile terminal 100 and also controls that a photo or video be taken. On the other hand, when the mobile terminal 100 is rotated or moved, it may happen that the subjects 1210 and 1220 selected in the preview image of the front camera 121a are only partially discovered in the preview image of the rear camera 121b. In the example shown in Fig. 12A(d), only the face of the character 1210, among the plurality of previously selected subjects 1210 and 1220, is discovered in the preview image of the rear camera 121b. Once a part (e.g., the face 1210) of the plurality of previously selected subjects 1210 and 1220 has been discovered, if it is not possible to discover the remainder of the subjects 1220 despite the lapse of a predefined time, the controller 180 can adjust a position of the camera lens or direct the camera to zoom out to discover the remaining subject(s). For example, referring to Fig. 12B, the controller 180 can control the remainder of the subjects 1220 to appear in the preview image by moving the lens of the rear camera 121b. For another example, referring to Fig. 12C, the controller 180 can control the remainder of the subjects 1220 to appear in the preview image by making the rear camera 121b zoom out. If all of the plurality of subjects 1210 and 1220 appear on the preview image of the rear camera 121b, as mentioned in the preceding description with reference to Fig. 11, the controller 180 may control the production of an alarm. [0050] As in the examples shown in Fig. 12B and Fig. 12C, by adjusting the rear camera 121b so that the plurality of subjects appear on the preview image, the preview image of the rear camera 121b can configure a composition similar to that of the preview image of the front camera 121a; the required zoom factor can be estimated as sketched below.
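The zoom-out adjustment of Fig. 12C can be estimated from geometry: find the factor by which the view must widen so that every selected subject's box fits inside the frame. The sketch below assumes zooming scales coordinates about the frame center and adds an assumed 10% margin.

```python
# Sketch: smallest zoom factor (relative to the current view) that brings
# all subject boxes inside the frame, scaling about the frame center.
def zoom_factor_to_fit(frame_w, frame_h, boxes, margin=0.10):
    cx, cy = frame_w / 2, frame_h / 2
    factor = 1.0
    for x, y, w, h in boxes:
        # Opposite corners cover all four coordinate extremes of the box.
        for px, py in ((x, y), (x + w, y + h)):
            dx, dy = abs(px - cx), abs(py - cy)
            if dx > 0:
                factor = min(factor, (cx / (1 + margin)) / dx)
            if dy > 0:
                factor = min(factor, (cy / (1 + margin)) / dy)
    return factor   # < 1.0 means the rear camera 121b must zoom out

face = (100, 200, 180, 180)     # subject 1210, fully visible
flag = (900, -150, 200, 420)    # subject 1220, partly above the frame
print(zoom_factor_to_fit(1280, 720, [face, flag]))  # ~0.64: zoom out to ~0.64x
```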
Subsequently, if the plurality of subjects do not deviate from the preview image of the rear camera 121b for a predefined time, the controller 180 may control that a photo or video be taken automatically. The embodiment described with reference to Figs. 12A to 12C is applicable to the case of activating the rear camera 121b first and then activating the front camera 121a later. In addition, although Fig. 12A(b) shows an example where a subject on the preview image of the front camera 121a is selected in response to a user's touch input, the controller 180 is also able to select a subject within the preview image of the front camera 121a automatically. According to the examples shown in Fig. 11 and Fig. 12, based on a subject or object selected from an image inputted through a front camera, a photo may be taken automatically through a rear camera. In contrast to the examples shown in Fig. 11 and Fig. 12, based on a subject or object selected from a previously taken photo (e.g., a photo taken through the first camera), the controller 180 can control that a photo be taken automatically through the second camera. As in the previous embodiments described with reference to Fig. 11 and Fig. 12, in the case of taking photos with a first camera and a second camera, the controller 180 can control a shooting mode or a camera setting value applied when the first camera is activated to be applied identically to the case where the second camera is activated. In this case, the shooting mode may mean a mode in which camera setting values are set by default to suit a situation, such as a portrait shooting mode, a fast shooting mode, a landscape shooting mode, a night shooting mode, or the like. The camera setting values can mean values that are adjustable for taking a photo or video (for example, aperture, shutter speed, exposure, image quality, etc.). [0051] By applying the embodiments described with reference to Fig. 11 and Fig. 12, the controller 180 can control an alarm to be produced and a shot to be taken according to whether a subject selected from a previously taken photo, or an object included in a region selected in a previously taken photo, appears on a preview image. [0052] For example, after a subject has been selected in a previously taken photo, if the subject is discovered on a preview image inputted through a camera, the controller 180 produces an alarm and is able to control that a photo be taken. Thus, the controller 180 can take a new photo by applying the same shooting mode or setting value as the previously taken photo. In this case, the previously taken photo may be taken through the first camera and the new photo may be based on an image inputted through the second camera, but the present invention is not limited thereto. In another example, if a preview image is determined to configure a composition similar to that of a previously taken photo, the controller 180 can command that a photo be taken automatically. This is described in detail with reference to Fig. 13 as follows. Fig. 13 is a diagram for illustrating an embodiment of taking a photo having a composition similar to that of a previously taken photo according to the present invention. For the sake of clarity of the following description, assume that the previously taken photo is identical to that shown in Fig. 13(a). Referring to Fig. 13, the controller 180 can extract information about a subject from the previously taken photo.
In this case, the subject information may include at least one of identification information of the subject, position information of the subject on the photo, distance information between subjects in case a plurality of subjects are included, and the like. [0053] For example, from the photo shown in Fig. 13(a), the controller 180 may be able to extract subjects such as a user's face, a flag, and the like, and is also able to extract at least one of position information of the user's face on the photo, position information of the flag on the photo, and distance information concerning a distance between the face of the user and the flag. Next, referring to Fig. 13(b), once the camera is activated, a preview image can be displayed through the display unit. The controller 180 can determine whether the preview image displayed through the display unit configures the same composition as the previously taken photo. In particular, the controller 180 can determine whether the preview image and the previously taken photo configure the same composition based on at least one of whether a subject extracted from the previously taken photo is included in the preview image, whether a position of the subject in the preview image lies within a determined threshold distance from its position in the photo, whether a difference between a size of the subject in the preview image and its size in the photo falls within an error range, and whether a difference between each distance between subjects in the preview image and the corresponding distance between subjects in the photo falls within an error range. Since it is the face of the user, but not the flag, that is discovered on the preview image shown in Fig. 13(b), the controller 180 can determine that the same composition is not configured. Referring to Fig. 13(c), when the mobile terminal 100 is moved, both the human face and the flag may come to be discovered. In addition, the controller 180 can determine whether the preview image configures the same composition as the previously taken photo by further examining a position of the human face on the preview image, a position of the flag on the preview image, a distance between the human face and the flag, and the like. If the preview image shown in Fig. 13(c) is determined to configure the same composition as the previously taken photo, the controller 180 may produce an alarm. In addition, if the same composition is maintained for a predetermined time, the controller 180 can control that a photo be taken automatically. According to the embodiment shown in Fig. 13, the previously taken photo and the new photo can be taken using different cameras, respectively [for example, the previously taken photo is taken through the first camera, while the new photo is created based on a preview image inputted through the second camera], but the present invention is not limited thereto. In this case, the controller 180 can take the new photo by applying the same shooting mode or setting value as the previously taken photo. According to the descriptions with reference to Figs. 11 to 13, a photo or video is taken after an alarm has been produced. Contrary to the descriptions, the controller 180 may be able to take a photo or video by skipping the alarm output step. A sketch of one such composition test follows.
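A minimal sketch of the composition test is given below, under the assumptions that subjects are represented by labeled bounding boxes and that the position, size, and distance tolerances (60 px, 25%, 25%) are free parameters; the disclosure leaves these thresholds unspecified.

```python
# Sketch: same subjects present, each near its remembered position, with
# size and inter-subject distances within an error range. Thresholds assumed.
import math

def center(box):
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def same_composition(ref, cur, pos_tol=60, size_tol=0.25, dist_tol=0.25):
    if set(ref) != set(cur):                       # a subject is missing
        return False
    for name in ref:
        (rx, ry), (cx, cy) = center(ref[name]), center(cur[name])
        if math.hypot(rx - cx, ry - cy) > pos_tol:
            return False                           # position threshold
        r_area = ref[name][2] * ref[name][3]
        c_area = cur[name][2] * cur[name][3]
        if abs(c_area - r_area) / r_area > size_tol:
            return False                           # size error range
    names = sorted(ref)
    for a, b in zip(names, names[1:]):             # consecutive subject pairs
        rd = math.dist(center(ref[a]), center(ref[b]))
        cd = math.dist(center(cur[a]), center(cur[b]))
        if rd and abs(cd - rd) / rd > dist_tol:
            return False                           # distance error range
    return True

ref = {"face": (200, 150, 120, 120), "flag": (700, 100, 80, 200)}
cur = {"face": (220, 170, 115, 115), "flag": (710, 120, 85, 210)}
print(same_composition(ref, cur))   # True: composition matches, as in Fig. 13(c)
```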
Once a specific region has been set on a preview image inputted through the camera 121, if an object enters the defined region, the controller 180 generates an alarm and is able to control that a photo or video be taken automatically. This is described in detail with reference to Fig. 14 and Fig. 15 as follows. Fig. 14 is a diagram for illustrating an embodiment of automatically taking a photo in the case of entry of a new object into a region defined according to the present invention. Referring to Fig. 14(a), based on a user touch input, the controller 180 may define a determined region 1410 on a preview image. As mentioned in the foregoing description with reference to Fig. 5A, the defined region 1410 can be identified visually. If a new object 1420 appears in the defined region 1410, the controller 180 generates an alarm and is also able to control that a photo or video be taken automatically. As in the example shown in Fig. 14(b), if the sun 1420 appears in the defined region 1410, the controller 180 can control the production of an alarm through the mobile terminal 100. [0054] If the recently appeared object 1420 does not deviate from the defined region 1410 or the preview image for a predefined time, the controller 180 can control that a photo or video be taken automatically. [0055] When an object has already been included in a region defined by a user, if a new object appears in the defined region, the controller 180 can control that a photo or video be taken, except in the case where the object already included in the defined region merely reappears. Fig. 15 is a diagram for illustrating another embodiment of taking a photo automatically in the case of entry of a new object into a region defined according to the present invention. Referring to Fig. 15(a), based on a user touch input, the controller 180 may define a determined region 1510 on a preview image. As mentioned in the foregoing description with reference to Fig. 5A, the defined region 1510 can be identified visually. According to the example shown in Fig. 15(a), a goalkeeper has already been included in the defined region 1510. As in the example shown in Fig. 15(b), if the goalkeeper 1520 disappears from the defined region 1510 and then reappears in the defined region 1510, the controller 180 does not take a photo or video. However, if the goalkeeper 1520 disappears or reappears, as mentioned in the above description with reference to Fig. 5B, an alarm can be generated via the mobile terminal 100. On the other hand, referring to Fig. 15(c), if a new object (e.g., a football) 1530 appears in the defined region 1510, the controller 180 may produce an alarm. If the recently appeared object 1530 does not deviate from the defined region 1510 or the preview image for a predefined time, the controller 180 can control that a photo or video be taken automatically. According to the descriptions with reference to Fig. 14 and Fig. 15, if a new object appears in a defined region, an alarm is generated and a photo or video is then taken. On the other hand, contrary to the descriptions, the controller 180 may be able to take a photo or video by skipping the alarm production step; this entry trigger is sketched below.
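The Fig. 14/Fig. 15 trigger can be sketched as set bookkeeping: remember the region's initial occupants and fire only when a genuinely new identity enters. The object identities below are assumed to come from whatever recognition the terminal performs.

```python
# Sketch: remember the region's initial occupants; only a genuinely new
# object entering the defined region fires the alarm and the auto-shot
# (a remembered occupant merely reappearing does not, per Fig. 15).
def region_events(initial_ids, frames):
    known = set(initial_ids)                   # e.g., the goalkeeper 1520
    for ids_in_region in frames:
        for obj in ids_in_region:
            if obj not in known:
                known.add(obj)
                yield f"new object {obj}: alarm + auto-shoot"

frames = [{"goalkeeper"}, set(), {"goalkeeper"}, {"goalkeeper", "ball"}]
for event in region_events({"goalkeeper"}, frames):
    print(event)   # fires once, when the ball 1530 enters the defined region
```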
Accordingly, the embodiments of the present invention provide various effects and/or features. [0056] First, the present invention can provide a mobile terminal for improving user comfort. In particular, if a subject deviates from a preview image, the present invention can provide a mobile terminal and a control method thereof for producing an alarm. [0057] It will be understood by those skilled in the art that the present invention may be specified in other form(s) without departing from the spirit and scope of the invention. Moreover, the methods described above can be implemented in a program-recorded medium in the form of processor-readable codes. [0058] The processor-readable medium may include all kinds of recording devices in which processor-readable data are stored. Examples of such processor-readable media include ROM, RAM, CD-ROM, magnetic tapes, floppy disks, optical data storage elements, and the like, and also include carrier-wave type implementations (for example, transmission over the Internet). It will be understood by those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention. Thus, it is understood that the present invention covers the modifications and variations of this invention provided that they fall within the scope of the appended claims and their equivalents.
Claims (17) [0001] 1. A mobile terminal (100) comprising: a camera (121); a display unit (151) configured to display a preview image inputted through the camera; and a controller (180) configured to: select an object in the preview image in response to a user input to touch the object or define a specific region including the object; and produce an alarm when the object exits the preview image. [0002] 2. The mobile terminal (100) of claim 1, wherein if the object remains in the preview image or the specific region for a specified duration, the controller (180) is configured to automatically control the camera to take a photo or video. [0003] 3. The mobile terminal (100) according to claim 1, wherein the controller (180) is configured to adjust a shooting direction of the camera (121) by following a motion of the object. [0004] 4. The mobile terminal (100) according to claim 1, wherein if it is predicted that the object is exiting the preview image, the controller (180) is configured to control the camera (121) to zoom out to widen an angle of view. [0005] 5. The mobile terminal (100) according to claim 1, wherein while a video is recorded, if the object exits the preview image or the specific region, the controller (180) is configured to control the camera (121) to pause recording of the video. [0006] 6. The mobile terminal (100) according to claim 5, wherein if the object reappears in the preview image or the specific region, the controller (180) is configured to control the camera (121) to resume recording of the video. [0007] 7. The mobile terminal (100) of claim 1, wherein if a new object appears in the preview image or the specific region, the controller (180) is configured to produce the alarm. [0008] 8. The mobile terminal (100) of claim 1, wherein while a video is recorded, if a new object appears in the preview image or the specific region, the controller (180) is configured to control the camera (121) to pause recording of the video. [0009] 9. The mobile terminal (100) of claim 8, wherein if the new object disappears from the preview image or the specific region, the controller (180) is configured to control the camera (121) to resume recording of the video. [0010] 10. The mobile terminal (100) of claim 1, wherein if a new object appears in the specific region, the controller (180) is configured to control the camera (121) to take a photo or video automatically. [0011] 11. The mobile terminal (100) of claim 1, wherein the mobile terminal (100) comprises a first camera (121) and a second camera (122), and wherein one of the first camera (121) and the second camera (122) is on a front side of the mobile terminal (100), and the other is on a rear side of the mobile terminal (100). [0012] 12. The mobile terminal (100) according to claim 11, wherein if the object is selected via a first preview image inputted through the first camera (121) and the object is thereafter discovered in a second preview image inputted via the second camera (122), the controller (180) is configured to control one of the first camera (121) and the second camera (122) to take a photo or video automatically. [0013] 13. The mobile terminal (100) according to claim 12, wherein a shooting mode, defined during the activation of the first camera (121), is maintained even when the second camera (122) is activated.
[0014] 14. A mobile terminal (100) comprising: a camera (121); a display unit (151) configured to display a preview image inputted through the camera (121); and a controller (180) configured to retrieve identification information of an object from a previously taken photo, and, if the retrieved identification information of the object matches the preview image inputted via the camera (121), to control the camera (121) to take a new photo automatically. [0015] 15. The mobile terminal (100) according to claim 14, wherein the new photo is taken by applying at least one of a shooting mode and a camera setting value applied when the previously taken photo was taken. [0016] 16. The mobile terminal (100) according to claim 14, wherein the camera (121) comprises a first camera (121) and a second camera (122), wherein the previously taken photo is taken through the first camera (121), and wherein the new photo is taken through the second camera (122). [0017] 17. A method of controlling a mobile terminal (100), comprising: displaying a preview image inputted via a camera (121); selecting an object in response to a user input to touch the object or define a specific region including the object; and producing an alarm when the object exits the preview image.